Kernel ellipsoidal trimming

Authors

  • Alexander N. Dolia
  • Christopher J. Harris
  • John Shawe-Taylor
  • D. M. Titterington
Abstract

Ellipsoid estimation is important in many practical areas such as control, system identification, visual/audio tracking, experimental design, data mining, robust statistics and statistical outlier or novelty detection. A new method, called Kernel Minimum Volume Covering Ellipsoid (KMVCE) estimation, that finds an ellipsoid in a kernel-defined feature space is presented. Although the method is very general and can be applied to many of the aforementioned problems, the main focus is on the problem of statistical novelty/outlier detection. A simple iterative algorithm based on Mahalanobis-type distances in the kernel-defined feature space is proposed for practical implementation. The probability that a non-outlier is misidentified by our algorithms is analysed using bounds based on Rademacher complexity. The KMVCE method performs very well on a set of real-life and simulated datasets, when compared with standard kernel-based novelty detection methods.
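The iterative algorithm the abstract alludes to can be illustrated in the plain (non-kernel) setting by the classical Khachiyan-style reweighting for the minimum-volume covering ellipsoid: at each step the point with the largest Mahalanobis-type distance under the current weights has its weight increased. This is a minimal input-space sketch, not the paper's kernelized method; the names `mvce`, `tol` and `max_iter` are illustrative choices.

```python
import numpy as np

def mvce(X, tol=1e-3, max_iter=20000):
    """Minimum-volume covering ellipsoid via iterative reweighting.
    Each step boosts the weight of the point with the largest
    Mahalanobis-type distance under the current weighted scatter."""
    n, d = X.shape
    Q = np.column_stack([X, np.ones(n)]).T      # (d+1) x n lifted points
    u = np.full(n, 1.0 / n)                     # uniform initial weights
    for _ in range(max_iter):
        V = (Q * u) @ Q.T                       # weighted scatter of lifted points
        M = np.sum(Q * np.linalg.solve(V, Q), axis=0)  # lifted Mahalanobis distances
        j = int(np.argmax(M))
        err = M[j] - (d + 1)                    # optimality gap
        if err < tol:
            break
        step = err / ((d + 1) * (M[j] - 1.0))   # line-search step toward vertex j
        u *= (1.0 - step)
        u[j] += step
    c = X.T @ u                                 # ellipsoid centre
    A = np.linalg.inv((X * u[:, None]).T @ X - np.outer(c, c)) / d
    return c, A, u                              # ellipsoid: (x-c)^T A (x-c) <= 1
```

Points carrying the largest final weights lie on the ellipsoid boundary, which is what makes the weights usable for trimming and for scoring candidate outliers; the kernel variant in the paper replaces these distances with Mahalanobis-type distances in the kernel-defined feature space.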


Similar resources

Kernel Ellipsoidal Trimming (University of Southampton technical report)

Ellipsoid estimation is an issue of primary importance in many practical areas such as control, system identification, visual/audio tracking, experimental design, data mining, robust statistics and novelty/outlier detection. This paper presents a new method of kernel information matrix ellipsoid estimation (KIMEE) that finds an ellipsoid in a kernel-defined feature space based on a centered inf...


Robust nonparametric kernel regression estimator

In the robust nonparametric kernel regression context, we prescribe a method to select the trimming parameter and bandwidth. Through solving estimating equations, we control the outlier effect by combining weighting and trimming. We show asymptotic consistency, establish bias and variance properties, and derive asymptotics. © 2016 Elsevier B.V. All rights reserved.


Optimal Spherical Separability: Towards Optimal Kernel Design

In this research paper, the concept of hyperspherical/hyper-ellipsoidal separability is introduced. A method of arriving at the optimal hypersphere (maximizing the margin) separating two classes is discussed. By projecting the quantized patterns into a higher-dimensional space (as in encoders of error-correcting codes), the patterns are made hyper-spherically separable. Single/multiple layers of spheric...


KPCA-based training of a kernel fuzzy classifier with ellipsoidal regions

In a fuzzy classifier with ellipsoidal regions, a fuzzy rule, based on the Mahalanobis distance, is defined for each class. The fuzzy rules are then tuned so that the recognition rate on the training data is maximized. In most cases, one fuzzy rule per class is enough to obtain high generalization ability. But in some cases, we need to partition the class data to define more than o...
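The per-class Mahalanobis-distance rule described above can be sketched as a toy membership function, one "rule" per class with membership decaying in the squared distance. This is an illustrative construction under assumed Gaussian-like decay, not the tuned rules of the cited paper; `mahalanobis_membership` is a hypothetical helper name.

```python
import numpy as np

def mahalanobis_membership(x, centers, covs):
    """Toy per-class membership: one Mahalanobis-distance rule per class,
    with membership decaying exponentially in the squared distance."""
    scores = []
    for c, S in zip(centers, covs):
        d = np.asarray(x) - np.asarray(c)
        m2 = d @ np.linalg.solve(S, d)     # squared Mahalanobis distance
        scores.append(np.exp(-0.5 * m2))   # Gaussian-like membership
    scores = np.array(scores)
    return scores / scores.sum()           # normalised class memberships
```

A sample is then assigned to the class with the largest membership; the cited paper additionally tunes the rules (e.g. scaling the distances) to maximize the training recognition rate.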


Resistant Dimension Reduction

Existing dimension reduction (DR) methods such as ordinary least squares (OLS) and sliced inverse regression (SIR) often perform poorly in the presence of outliers. Ellipsoidal trimming can be used to create outlier resistant DR methods that can also give useful results when the assumption of linearly related predictors is violated. Theory for SIR and OLS is reviewed, and it is shown that sever...



Journal title:
  • Computational Statistics & Data Analysis

Volume 52  Issue 

Pages  -

Publication date 2007